
    Coherent measures of the impact of co-authors in peer review journals and in proceedings publications

    This paper focuses on the coauthor effect in different types of publications, usually not equally respected in measuring research impact. A priori unexpected relationships are found between the total coauthor core value, $m_a$, of a leading investigator (LI), and the related values for their publications in either peer review journals ($j$) or in proceedings ($p$). A surprisingly linear relationship is found: $m_a^{(j)} + 0.4\, m_a^{(p)} = m_a^{(jp)}$. Furthermore, another relationship is found concerning the measure of the total number of citations, $A_a$, i.e. the surface of the citation size-rank histogram up to $m_a$. Another linear relationship exists: $A_a^{(j)} + 1.36\, A_a^{(p)} = A_a^{(jp)}$. These empirically found coefficients (0.4 and 1.36) are supported by considerations based on an empirical power law found between the number of joint publications of an author and the rank of a coauthor. Moreover, a simple power-law relationship is found between $m_a$ and the number ($r_M$) of coauthors of a LI: $m_a \simeq r_M^{\mu}$; the power-law exponent $\mu$ depends on the type ($j$ or $p$) of publications. These simple relations, at this time limited to publications in physics, imply that coauthors are a "more positive measure" of a principal investigator's role, in both types of scientific outputs, than the Hirsch index could indicate. Therefore, scorning co-authors in publications, in particular in proceedings, is unwarranted. On the contrary, the findings suggest an immediate test of coherence of scientific authorship in scientific policy processes. Comment: 22 pages; 2 Tables; 6 Figures; 38 references; prepared for Physica
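
    A minimal numerical sketch of how the first linear relation could be checked in practice is given below; the $m_a$ values are invented for illustration (they are not taken from the paper), and the mixing coefficient is simply estimated by least squares.

        import numpy as np

        # Hypothetical co-author core values for a few leading investigators (LIs):
        # m_a measured on journal papers (j), on proceedings (p), and on both (jp).
        # These numbers are illustrative only, not data from the paper.
        m_j  = np.array([12, 18, 25, 30, 41], dtype=float)
        m_p  = np.array([ 5,  9, 14, 20, 28], dtype=float)
        m_jp = np.array([14, 22, 31, 38, 52], dtype=float)

        # The abstract reports m_a^(j) + 0.4 m_a^(p) = m_a^(jp); estimate the
        # coefficient c in (m_jp - m_j) = c * m_p by least squares.
        c, _, _, _ = np.linalg.lstsq(m_p[:, None], m_jp - m_j, rcond=None)
        print(f"fitted coefficient c = {c[0]:.2f}  (the abstract reports 0.4)")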

    Econophysics in Belgium. The first (?) 15 years

    This paper reviews the econophysics activities in Belgium from my admittedly biased point of view. Previously unknown historical notes and facts are presented for the first time, explaining the aims, whence the evolution, of the research papers and the friendly connections with colleagues. Comments on endeavors are also provided. The lack of official, academic and private support is outlined. Comment: 6 pages, 79 refs.; written for "Econophysics", a special issue of "Science and Culture" (Kolkata, India) to celebrate 15 years of Econophysics

    A scientometrics law about co-authors and their ranking. The co-author core

    Rather than "measuring" a scientist's impact through the number of citations which his/her published work can have generated, isn't it more appropriate to consider his/her value through his/her scientific network performance, illustrated by his/her co-author role, thus focussing on his/her joint publications, and their impact through citations? Whence, on one hand, this paper very briefly examines bibliometric laws, like the $h$-index and the subsequent debate about co-authorship effects, but on the other hand, proposes a measure of collaborative work through a new index. Based on data about the publication output of a specific research group, a new bibliometric law is found. Let a co-author $C$ have written $J$ (joint) publications with one or several colleagues. Rank all the co-authors of that individual according to their number of joint publications, giving a rank $r$ to each co-author, starting with $r=1$ for the most prolific. It is empirically found that a very simple relationship holds between the number of joint publications $J$ by coauthors and their rank of importance, i.e. $J \propto 1/r$. Thereafter, in the same spirit as for the Hirsch core, one can define a "co-author core", and introduce indices operating on an author. It is emphasized that the new index has a quite different (philosophical) perspective than the $h$-index. In the present case, one focusses on "relevant" persons rather than on "relevant" publications. Although the numerical discussion is based on one case, there is little doubt that the law can be verified in many other situations. Therefore, variants and generalizations could later be produced in order to quantify co-author roles, in temporary or long-lasting stable team(s), and lead to criteria about funding, career measurements, or even induce career strategies. Comment: REVISED VERSION: 3 figures, 13 pages, 82 references, 3 tables; post-conference paper for COST Action MP-0801, 'Physics of Competition and Conflict': in particular "Evaluating Science: Modern Scientometric Methods", in Sofia, May 201
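
    Below is a minimal sketch of the co-author core idea, assuming (by analogy with the Hirsch core) that $m_a$ is the largest rank $r$ at which the rank-ordered number of joint publications still satisfies $J(r) \geq r$; the joint-publication counts are invented for illustration.

        import numpy as np

        # Invented joint-publication counts J(r) for the co-authors of one author,
        # already ranked from the most prolific (r = 1) downwards.
        J = np.array([40, 20, 13, 10, 8, 7, 6, 5, 4, 4, 4, 3, 3, 3, 3], dtype=float)
        r = np.arange(1, len(J) + 1)

        # Co-author core, by analogy with the Hirsch core (assumption: m_a is the
        # largest rank r such that J(r) >= r).
        m_a = int(np.max(r[J >= r]))
        print("co-author core m_a =", m_a)   # here: 6

        # Rough check of the J ~ 1/r law: fit log J = intercept + slope * log r.
        slope, intercept = np.polyfit(np.log(r), np.log(J), 1)
        print(f"fitted exponent ~ {-slope:.2f}  (the abstract reports J proportional to 1/r)")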

    Econophysics of Stock and Foreign Currency Exchange Markets

    Econophysics is a science in its infancy, born about ten years ago at this time of writing, at the crossroads of physics, mathematics, computing and of course economics and finance. It also covers human sciences, because all economics is ultimately driven by human decisions. Because of this human factor, econophysics has no hope of achieving the status of an exact science, but it is interesting to discover what can be achieved, uncovering potential limits and trying to push these limits further away. A few data analysis techniques are described, with emphasis on the Detrended Fluctuation Analysis (DFA) and the Zipf Analysis Technique (ZAT). Information about the original data is sketchy, but the data concerns mainly the foreign currency exchange market. The robustness of the DFA technique is underlined. Additional remarks are given for suggesting further work. Models of financial value evolution are recalled, again without going into elaborate work discussing typical agent behaviors, but rather with hopefully sufficient information such that the basic ingredients can be memorized before reading some of the vast literature on price formation. Crashes, being spectacular phenomena, retain our attention, and do so through data analysis and basic intuitive models. A few statistical and microscopic models are outlined. Comment: intended as a review to be published in a book edited by B.K. Chakrabarti, A. Chakraborti and A. Chatterjee, Wiley-VCH, Berlin: 25 pages, 3 figures, 2 tables, 94 references
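
    Since the review puts emphasis on the Detrended Fluctuation Analysis, a minimal sketch of textbook first-order DFA follows; the synthetic white-noise series and the box sizes are arbitrary choices, not the data analyzed in the review.

        import numpy as np

        def dfa(x, box_sizes, order=1):
            """Return the DFA fluctuation function F(n) for each box size n."""
            y = np.cumsum(x - np.mean(x))                 # integrated (profile) series
            F = []
            for n in box_sizes:
                f2 = []
                for k in range(len(y) // n):
                    seg = y[k * n:(k + 1) * n]
                    t = np.arange(n)
                    trend = np.polyval(np.polyfit(t, seg, order), t)   # local polynomial trend
                    f2.append(np.mean((seg - trend) ** 2))             # detrended variance
                F.append(np.sqrt(np.mean(f2)))
            return np.array(F)

        rng = np.random.default_rng(0)
        x = rng.normal(size=4096)                         # white noise; its profile is a random walk
        sizes = np.array([8, 16, 32, 64, 128, 256])
        F = dfa(x, sizes)

        # Scaling exponent alpha from F(n) ~ n^alpha; about 0.5 is expected for white noise.
        alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
        print(f"DFA exponent alpha ~ {alpha:.2f}")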

    Toward fits to scaling-like data, but with inflection points & generalized Lavalette function

    Experimental and empirical data are often analyzed on log-log plots in order to find some scaling argument for the observed/examined phenomenon at hand, in particular in rank-size rule research, but also in critical phenomena in thermodynamics, and in fractal geometry. The fit to a straight line on such plots is not always satisfactory. Deviations occur at low, intermediate and high regimes along the log($x$)-axis. Several improvements of the mere power-law fit are discussed, in particular through a Mandelbrot trick at low rank and a Lavalette power-law cut-off at high rank. In so doing, the number of free parameters increases. Their meaning is discussed, up to the super-generalized Lavalette law with 5 free parameters and the hyper-generalized Lavalette law with 7 free parameters. It is emphasized that the interest of the basic Lavalette law, with 2 free parameters, and of the subsequent generalizations resides in its "noid" (or sigmoid, depending on the sign of the exponents) form on a semi-log plot; something which cannot be found in other empirical laws, like the Zipf-Pareto-Mandelbrot law. It remained, for completeness, to invent a simple law showing an inflection point on a log-log plot. Such a law can result from a transformation of the Lavalette law through $x \rightarrow \log(x)$, but its meaning is theoretically unclear. However, a simple linear combination of two basic Lavalette laws is shown to provide the requested feature. Generalizations taking into account two super-generalized or hyper-generalized Lavalette laws are suggested, but need to be fully considered at fit time on appropriate data. Comment: 31 pages, 38 references, 22 figures, prepared for Journal of Applied Quantitative Methods
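
    As a concrete companion to the fitting discussion, here is a minimal sketch assuming the commonly quoted 2-free-parameter Lavalette form $y(r) = K\,[Nr/(N+1-r)]^{-\gamma}$ and a linear combination of two such terms (the inflection-point idea); the synthetic rank-size data, noise level and starting values are arbitrary.

        import numpy as np
        from scipy.optimize import curve_fit

        def lavalette(r, K, gamma, N):
            # Assumed basic Lavalette form; N (the number of ranks) is held fixed.
            return K * (N * r / (N + 1.0 - r)) ** (-gamma)

        def two_lavalette(r, K1, g1, K2, g2, N):
            # Linear combination of two basic Lavalette laws.
            return lavalette(r, K1, g1, N) + lavalette(r, K2, g2, N)

        N = 100
        r = np.arange(1, N + 1, dtype=float)

        # Synthetic rank-size data built from a two-term Lavalette shape plus noise.
        rng = np.random.default_rng(1)
        y = two_lavalette(r, 50.0, 0.3, 5.0, 1.2, N) * rng.lognormal(0.0, 0.05, size=N)

        # Fit the one-term and two-term forms (N fixed through the lambdas).
        p1, _ = curve_fit(lambda r, K, g: lavalette(r, K, g, N), r, y, p0=[10.0, 0.5])
        p2, _ = curve_fit(lambda r, K1, g1, K2, g2: two_lavalette(r, K1, g1, K2, g2, N),
                          r, y, p0=[10.0, 0.5, 1.0, 1.0], maxfev=20000)
        print("one-term Lavalette fit:", np.round(p1, 3))
        print("two-term Lavalette fit:", np.round(p2, 3))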

    A biased view of a few possible components when reflecting on the present decade financial and economic crisis

    Is the present economic and financial crisis similar to some previous one? It would be so nice to prove that universality laws exist for predicting such rare events under a minimum set of realistic hypotheses. First, I briefly recall whether patterns, like business cycles, are indeed found, and can be modeled within a statistical physics, or econophysics, framework. I point to a simulation model for describing such so-called business cycles, under exo- and endogenous conditions. I discuss self-organized and provoked crashes and their predictions. I emphasize the role of an often forgotten ingredient: the time delay in the information flow. I wonder about the information content of financial data, its misinterpretation and market manipulation. Comment: 13 pages; 70 refs.; a chapter prepared for "Polymorphic Crisis. Readings on the Great Recession of the 21st century" edited by Roy Cerqueti

    Logistic Modeling of a Religious Sect Features

    The financial characteristics of sects are challenging topics. The present paper concerns the Antoinist Cult community (ACC), which appeared at the end of the 19th century in Belgium, had quite an expansion, and is now decaying. The historical perspective is described in an Appendix. Although surely of marginal importance in religious history, the numerical and analytic description of the ACC growth AND decay evolution per se should hopefully permit generalizations toward the behavior of other sects, with either a longer lifetime, i.e. so-called religions or churches, or of others with a shorter lifetime. Due to the specific aims and rules of the community, in particular the lack of proselytism and the strict acceptance of only anonymous financial gifts, only an indirect measure of the evolution of their member number can be studied. This is done here first through the time dependence of new temple inaugurations, between 1910 and 1940. Besides, the community's yearly financial reports can be analyzed. They are legally known between 1920 and 2000. Interestingly, several regimes are seen, with different time spans. The agent-based model chosen to describe both the temple number and the finance evolutions is the Verhulst logistic function, taking into account the limited resources of the population. Such a function remarkably fits the temple number evolution, taking into account a no-construction time gap, historically explained. The empirical Gompertz law can also be used for fitting this temple number evolution data, as shown in an Appendix. It is thereby concluded that strong social forces have been acting both in the growth and decay phases. Comment: 21 pages, 8 figures, 57 references; an updated version will be published in 'Econophysics of Agent-Based Models: Proc. Econophys-Kolkata VII', Eds. F. Abergel et al., Springer (2014)
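
    A minimal sketch of a Verhulst logistic fit in the spirit of the temple-number analysis follows; the yearly cumulative counts below are invented placeholders, not the ACC data.

        import numpy as np
        from scipy.optimize import curve_fit

        def verhulst(t, K, r, t0):
            # Logistic (Verhulst) growth: carrying capacity K, growth rate r, midpoint t0.
            return K / (1.0 + np.exp(-r * (t - t0)))

        # Invented cumulative number of temples per year (placeholder data).
        years = np.arange(1910, 1941, dtype=float)
        temples = np.array([ 1,  1,  2,  3,  4,  6,  8, 10, 13, 16,
                            19, 22, 24, 26, 27, 28, 28, 29, 29, 30,
                            30, 30, 31, 31, 31, 31, 31, 32, 32, 32, 32], dtype=float)

        popt, _ = curve_fit(verhulst, years, temples, p0=[30.0, 0.3, 1920.0])
        K, r, t0 = popt
        print(f"carrying capacity K ~ {K:.1f}, rate r ~ {r:.2f}/yr, midpoint ~ {t0:.0f}")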

    For A Lecture on Scientific Meteorology within Statistical ("Pure") Physics Concepts

    Various aspects of modern statistical physics and meteorology can be tied together. Critical comments have to be made. However, the historical importance of the University of Wroclaw in the field of meteorology should first be pointed out. Next, some basic differences about time and space scales between meteorology and climatology can be outlined. The nature and role of clouds, both from a geometric and a thermal point of view, are recalled. Recent studies of scaling laws for atmospheric variables are mentioned, like studies on cirrus ice content, brightness temperature, liquid water path fluctuations, cloud base height fluctuations, etc. Technical time series analysis approaches based on modern statistical physics considerations are outlined. Comment: the originally extended version of http://arXiv.org/abs/physics/0401066 as presented at the 18th Max Born Symposium entitled "Physics: Statistical Physics outside Physics", Ladek Zdroj, Poland, September 22-25, 200

    Two-exponent Lavalette function. A generalization for the case of adherents to a religious movement

    The Lavalette function is generalized to a two-exponent function in order to represent data looking like a sigmoid on semi-log plots. A Mandelbrot trick is suggested for further investigations, if more fit parameters are needed. The analyzed data are the numbers of adherents to the main religions in the XXth century. Comment: 4 pages, 2 columns, 4 figures, 20 references
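
    For concreteness, a short sketch of a two-exponent generalization is given, assuming the form $y(r) = K\,(N+1-r)^{\xi}/r^{\chi}$ (consistent with the basic form used in the earlier sketch, but not necessarily the paper's exact notation); it only evaluates the function so that its sigmoid-like shape on a semi-log plot can be inspected.

        import numpy as np

        def lavalette2(r, K, xi, chi, N):
            # Assumed two-exponent Lavalette form; xi controls the high-rank drop,
            # chi the low-rank behaviour (xi = chi recovers a one-exponent law).
            return K * (N + 1.0 - r) ** xi / r ** chi

        N = 50
        r = np.arange(1, N + 1, dtype=float)
        y = lavalette2(r, 1.0, 0.7, 0.2, N)

        # Plotting log(y) against r would show the sigmoid-like shape mentioned
        # in the abstract; here a few sampled values are printed instead.
        for rr in (1, 10, 25, 40, 50):
            print(rr, f"{y[rr - 1]:.3f}")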

    Assessing the true role of coauthors in the h-index measure of an author scientific impact

    A method based on classical principal component analysis is used to demonstrate that the role of co-authors should give a group leader a higher $h$-index measure than usually accepted. The method rather easily gives what is usually searched for, i.e. an estimate of the role (or "weight") of co-authors, as the additional value to an author's paper popularity. The construction of the co-authorship popularity $H$-matrix is exemplified, and the role of the eigenvalues and of the main eigenvector components is discussed. An example illustrates the points and serves as the basis for suggesting a generally practical application of the concept. Comment: 13 pages; 40 refs
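
    Purely as an illustration of the eigenvalue/eigenvector step (the actual construction of the co-authorship popularity $H$-matrix is the one described in the paper, not reproduced here), the sketch below diagonalizes a small symmetric matrix filled with invented joint-popularity counts.

        import numpy as np

        # Invented symmetric "co-authorship popularity" matrix for 4 authors;
        # entry (i, j) stands for a joint-popularity count. Placeholder values only.
        H = np.array([[120., 40., 25., 10.],
                      [ 40., 60., 15.,  5.],
                      [ 25., 15., 30.,  8.],
                      [ 10.,  5.,  8., 20.]])

        # Eigen-decomposition of the symmetric matrix (eigh returns ascending eigenvalues).
        eigvals, eigvecs = np.linalg.eigh(H)
        print("leading eigenvalue:", round(eigvals[-1], 1))
        print("main eigenvector components (relative author weights):",
              np.round(np.abs(eigvecs[:, -1]), 3))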